On the principles of Parsimony and Self-consistency for the emergence of intelligence (Position Paper)

Yi MA, Doris TSAO, Heung-Yeung SHUM

Frontiers of Information Technology & Electronic Engineering, 2022, Volume 23, Issue 9, Pages 1298-1323. doi: 10.1631/FITEE.2200297

Abstract: Ten years into the revival of deep networks and artificial intelligence, we propose a theoretical framework that sheds light on understanding deep networks within a bigger picture of intelligence in general. We introduce two fundamental principles, Parsimony and Self-consistency, which address two fundamental questions regarding intelligence: what to learn and how to learn, respectively. We believe the two principles serve as the cornerstone for the emergence of intelligence, artificial or natural. While they have rich classical roots, we argue that they can be stated anew in entirely measurable and computable ways. More specifically, the two principles lead to an effective and efficient computational framework, compressive closed-loop transcription, which unifies and explains the evolution of modern deep networks and most practices of artificial intelligence. While we use mainly visual data modeling as an example, we believe the two principles will unify understanding of broad families of autonomous intelligent systems and provide a framework for understanding the brain.

Keywords: Intelligence; Parsimony; Self-consistency; Rate reduction; Deep networks; Closed-loop transcription
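
The abstract presents rate reduction as the measurable objective behind the Parsimony principle. As a concrete illustration only, below is a minimal sketch of a coding-rate-reduction measure in the spirit of the authors' earlier maximal-coding-rate-reduction (MCR²) work, assuming the standard Gaussian log-det rate estimate; the function names, the eps distortion parameter, and the label-based partition are illustrative assumptions, not the paper's own code.

import numpy as np

def coding_rate(Z, eps=0.5):
    # Log-det estimate of the coding rate (in nats) needed to encode the
    # feature matrix Z (d x n) up to distortion eps.
    d, n = Z.shape
    return 0.5 * np.linalg.slogdet(np.eye(d) + (d / (n * eps ** 2)) * (Z @ Z.T))[1]

def rate_reduction(Z, labels, eps=0.5):
    # Rate reduction: rate of the whole feature set minus the sample-weighted
    # rates of each class's features. Maximizing it expands the overall
    # representation while compressing each class.
    n = Z.shape[1]
    within = sum((np.sum(labels == c) / n) * coding_rate(Z[:, labels == c], eps)
                 for c in np.unique(labels))
    return coding_rate(Z, eps) - within

# Example: random 16-dimensional features for 2 classes of 50 samples each.
rng = np.random.default_rng(0)
Z = rng.standard_normal((16, 100))
labels = np.repeat([0, 1], 50)
print(rate_reduction(Z, labels))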
